Stein Variational Gradient Descent on Infinite-Dimensional Space and Applications to Statistical Inverse Problems

Authors

Abstract

In this paper, we propose an infinite-dimensional version of the Stein variational gradient descent (iSVGD) method for solving Bayesian inverse problems. The method can generate approximate samples from posteriors efficiently. Based on the concepts of operator-valued kernels and vector-valued reproducing kernel Hilbert spaces, a rigorous definition is given for infinite-dimensional objects, e.g., the Stein operator, which are proved to be the limits of finite-dimensional ones. Moreover, a more efficient iSVGD with preconditioning operators is constructed by generalizing the change of variables formula and introducing a regularity parameter. The proposed algorithms are applied to an inverse problem governed by the steady state Darcy flow equation. Numerical results confirm our theoretical findings and demonstrate the potential of the approach for posterior sampling in large-scale nonlinear statistical inverse problems.
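To make the role of preconditioning concrete, here is a finite-dimensional caricature (not the paper's operator-valued construction on function space): the standard SVGD direction is multiplied by a fixed symmetric positive-definite matrix `A`, chosen here as the target covariance. All names and choices (RBF kernel, bandwidth, step size, the Gaussian target) are illustrative assumptions.

```python
import numpy as np

# Finite-dimensional caricature of preconditioned SVGD. The paper builds
# preconditioning *operators* on infinite-dimensional spaces; here the
# preconditioner A is just a fixed SPD matrix (a hypothetical stand-in).

def svgd_direction(X, grad_log_p, h=1.0):
    """Standard SVGD direction with an RBF kernel of bandwidth h."""
    diffs = X[:, None, :] - X[None, :, :]                # (n, n, d)
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h**2))  # k(x_i, x_j)
    grad_K = -diffs / h**2 * K[:, :, None]               # d k(x_i, x_j) / d x_i
    # phi_i = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    return (K @ grad_log_p(X) + grad_K.sum(axis=0)) / len(X)

# Ill-conditioned Gaussian target N(0, C): grad log p(x) = -C^{-1} x.
C = np.diag([1.0, 0.01])
C_inv = np.linalg.inv(C)
grad_log_p = lambda X: -X @ C_inv

A = C  # precondition with the target covariance (one standard choice)
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
for _ in range(300):
    # The preconditioner rescales the stiff second coordinate so that a
    # single step size suits both directions.
    X = X + 0.05 * svgd_direction(X, grad_log_p) @ A.T
```

After the loop, the particles have contracted much more in the stiff second coordinate than in the first, mimicking how a well-chosen preconditioner adapts the update to the geometry of the posterior.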


Similar articles

Stein Variational Gradient Descent: Theory and Applications

Although optimization can be carried out very efficiently with gradient-based methods these days, Bayesian inference or probabilistic sampling has been considered much more difficult. Stein variational gradient descent (SVGD) is a particle-based inference method derived from a functional gradient descent that minimizes the KL divergence without explicit parametric assumptions. SVGD can be...
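The particle update described above can be sketched in a few lines. This is a generic illustration of the SVGD update (standard Gaussian target, RBF kernel with a fixed bandwidth), not code from any of the papers listed here; the bandwidth, step size, and particle count are arbitrary assumptions.

```python
import numpy as np

# Minimal SVGD sketch: transport particles toward a standard 2-D Gaussian
# by following phi(x) = E_j[ k(x_j, x) grad log p(x_j) + grad_{x_j} k(x_j, x) ].

def svgd_step(X, grad_log_p, step=0.1, h=1.0):
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]                # (n, n, d)
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h**2))  # RBF kernel matrix
    grad_K = -diffs / h**2 * K[:, :, None]               # d k(x_i, x_j) / d x_i
    # Driving term pulls particles toward high-density regions;
    # the kernel-gradient term repels them from each other.
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step * phi

# Target: standard normal, so grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=0.5, size=(100, 2))  # start far from the target
for _ in range(500):
    X = svgd_step(X, lambda x: -x)

print(np.round(X.mean(axis=0), 1))  # particles have drifted toward (0, 0)
```

The repulsive term is what distinguishes SVGD from plain gradient ascent on the log density: without it, all particles would collapse onto the mode instead of approximating the distribution.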

Stein Variational Gradient Descent as Gradient Flow

Stein variational gradient descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate given distributions, based on a gradient-based update that is guaranteed to optimally decrease the KL divergence within a function space. This paper develops the first theoretical analysis of SVGD. We establish that the empirical measures of the SVGD samples...

VAE Learning via Stein Variational Gradient Descent

A new method for learning variational autoencoders (VAEs) is developed, based on Stein variational gradient descent. A key advantage of this approach is that one need not make parametric assumptions about the form of the encoder distribution. Performance is further enhanced by integrating the proposed encoder with importance sampling. Excellent performance is demonstrated across multiple unsupe...

Learning to Draw Samples with Amortized Stein Variational Gradient Descent

We propose a simple algorithm to train stochastic neural networks to draw samples from given target distributions for probabilistic inference. Our method is based on iteratively adjusting the neural network parameters so that the output changes along a Stein variational gradient direction (Liu & Wang, 2016) that maximally decreases the KL divergence with the target distribution. Our method work...


Journal

Journal title: SIAM Journal on Numerical Analysis

Year: 2022

ISSN: 0036-1429, 1095-7170

DOI: https://doi.org/10.1137/21m1440773